A PAC Bound for Approximate Support Vector Machines

Authors

  • Dongwei Cao
  • Daniel Boley
Abstract

We study a class of algorithms that speed up the training process of support vector machines (SVMs) by returning an approximate SVM. We focus on algorithms that reduce the size of the optimization problem by extracting a small number of representatives from the original training dataset and using these representatives to train an approximate SVM. The main contribution of this paper is a PAC-style generalization bound for the resulting approximate SVM, which provides a learning-theoretic justification for using the approximate SVM in practice. The bound also generalizes, and includes as a special case, the generalization bound for the exact SVM, which in this paper denotes the SVM trained on the original training dataset.

Keywords: Support Vector Machines, Approximate Solutions, Generalization Bounds, Algorithmic Stability
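To make the setting concrete, here is a minimal sketch of a representative-based approximate SVM, assuming per-class k-means centroids as the representatives; this choice, and the scikit-learn pipeline around it, are illustrative assumptions and not the specific algorithm analyzed in the paper:

# Approximate SVM sketch: train on a few representatives instead of the full dataset.
# Using per-class k-means centroids as representatives is an illustrative assumption.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)

def extract_representatives(X, y, per_class=50):
    """Return per-class k-means centroids and their labels."""
    reps, rep_labels = [], []
    for label in np.unique(y):
        km = KMeans(n_clusters=per_class, n_init=10, random_state=0).fit(X[y == label])
        reps.append(km.cluster_centers_)
        rep_labels.append(np.full(per_class, label))
    return np.vstack(reps), np.concatenate(rep_labels)

X_rep, y_rep = extract_representatives(X, y)

exact_svm = SVC(kernel="rbf", C=1.0).fit(X, y)            # exact SVM: full dataset (5000 points)
approx_svm = SVC(kernel="rbf", C=1.0).fit(X_rep, y_rep)   # approximate SVM: 100 representatives

print("exact  SVM accuracy on training data:", exact_svm.score(X, y))
print("approx SVM accuracy on training data:", approx_svm.score(X, y))

A generalization bound of the kind proved in the paper is what justifies trusting the approximate SVM even though it is trained on only a small fraction of the data.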

Similar articles

PAC-Bayesian Theorems for Gaussian Process Classification

We present distribution-free generalization error bounds which apply to a wide class of approximate Bayesian Gaussian process classification (GPC) techniques, powerful nonparametric learning methods similar to support vector machines. The bounds use the PAC-Bayesian theorem [8], for which we provide a simplified proof, leading to new insights into its relation to traditional VC-type union bound t...
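For orientation, the PAC-Bayesian theorem cited as [8] is commonly stated in roughly the following form (a standard textbook formulation, not necessarily the exact statement used in that paper). For a prior P over classifiers fixed before the sample is drawn, any δ in (0,1), and an i.i.d. sample S of size m, with probability at least 1 - δ, simultaneously for all posteriors Q,

    \mathrm{kl}\big(\hat{L}_S(Q) \,\|\, L_D(Q)\big) \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln(2\sqrt{m}/\delta)}{m}

where \hat{L}_S(Q) and L_D(Q) are the empirical and true risks of the Gibbs classifier drawn from Q, and kl denotes the KL divergence between two Bernoulli distributions.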

Considering Span of Support Vector Bounds in the Context of Computational Learning Theory

This report describes a recent bound on expected support vector machine (SVM) generalization error [3] and frames this work in the context of computational learning theory [1], discussing the practical value of these bounds and their contribution to our mathematical understanding of machine learning. The fundamentals of computational learning theory are first outlined: PAC learning is defined, an exampl...
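As a point of reference (not the span bound itself, but the older leave-one-out argument that the span bound refines), Vapnik's classical support-vector count bound states that

    \mathbb{E}\big[R(f_{m-1})\big] \;\le\; \frac{\mathbb{E}\big[\#\mathrm{SV}(f_m)\big]}{m}

that is, the expected test error of an SVM trained on m - 1 examples is at most the expected fraction of the m training points that become support vectors; the span bound tightens this by using the geometry of the support vectors rather than only their number.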

Simplified PAC-Bayesian Margin Bounds

The theoretical understanding of support vector machines is largely based on margin bounds for linear classifiers with unit-norm weight vectors and unit-norm feature vectors. Unit-norm margin bounds have been proved previously using fat-shattering arguments and Rademacher complexity. Recently, Langford and Shawe-Taylor proved a dimension-independent unit-norm margin bound using a relatively simpl...
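To illustrate the shape of such results, one standard Rademacher-complexity margin bound (not the specific bound of Langford and Shawe-Taylor) reads as follows. For unit-norm w and unit-norm feature vectors x, a fixed margin γ > 0, and any δ in (0,1), with probability at least 1 - δ over an i.i.d. sample of size m,

    \Pr\big[y \langle w, x \rangle \le 0\big] \;\le\; \hat{R}_{\gamma}(w) \;+\; \frac{2}{\gamma \sqrt{m}} \;+\; \sqrt{\frac{\ln(1/\delta)}{2m}}

where \hat{R}_{\gamma}(w) is the fraction of training examples with margin y⟨w, x⟩ < γ. The input dimension does not appear, which is the sense in which such bounds are dimension-independent.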

A Comparative Study of Extreme Learning Machines and Support Vector Machines in Prediction of Sediment Transport in Open Channels

The limiting velocity in open channels to prevent long-term sedimentation is predicted in this paper using a powerful soft computing technique known as Extreme Learning Machines (ELM). The ELM is a Single-Layer Feed-forward Neural Network (SLFNN) with a high training speed. The dimensionless parameter of limiting velocity, known as the densimetric Froude number (Fr), is predicte...
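To show why ELM training is fast, here is a generic ELM regression sketch (illustrative only; the data and network size are made up and this is not the study's model): only the output weights are learned, via a single least-squares solve, while the hidden-layer weights stay random.

# Generic Extreme Learning Machine (ELM) regressor: random hidden layer, least-squares readout.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                     # stand-in input features (hypothetical)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)  # stand-in target, e.g. a Froude-number-like quantity

n_hidden = 100
W = rng.normal(size=(X.shape[1], n_hidden))       # random input-to-hidden weights, never trained
b = rng.normal(size=n_hidden)                     # random hidden biases

H = np.tanh(X @ W + b)                            # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)      # output weights: one least-squares solve

y_hat = H @ beta
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))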

Journal title:

Volume   Issue

Pages  -

Publication date: 2007